Regularization (machine learning)

Any mechanism that reduces overfitting.1 Popular types of regularization include:

- L1 regularization
- L2 regularization
- dropout regularization
- early stopping (not a formal regularization method, but it can effectively limit overfitting)

Regularization can also be defined as a penalty on a model's complexity.
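To make the penalty view concrete, here is a minimal sketch of an L2-regularized loss: the data loss plus the sum of squared weights, scaled by a strength parameter. The function name and `lam` parameter are illustrative, not from any particular library.

```python
def l2_regularized_loss(data_loss, weights, lam):
    """Total loss = data loss + lam * (sum of squared weights).

    The second term is the L2 penalty: it grows with the magnitude
    of the weights, so more 'complex' models pay a higher price.
    """
    penalty = lam * sum(w * w for w in weights)
    return data_loss + penalty

# Same data loss, but larger weights incur a larger total loss:
simple = l2_regularized_loss(0.5, [0.1, -0.2], lam=0.01)   # 0.5005
complex_ = l2_regularized_loss(0.5, [3.0, -4.0], lam=0.01)  # 0.75
```

Increasing `lam` makes the penalty dominate, pushing training toward smaller weights and a simpler model.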

Training loss vs. real-world performance

Regularization is counterintuitive. Increasing regularization usually increases training loss, which is confusing because, well, isn’t the goal to minimize training loss?1

Actually, no. The goal isn’t to minimize training loss. The goal is to make excellent predictions on real-world examples. Remarkably, even though increasing regularization increases training loss, it usually helps models make better predictions on real-world examples.
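The trade-off above can be seen numerically. The sketch below (an illustrative setup, not from the glossary) fits ridge regression in closed form to noisy data with high-degree polynomial features, then increases the regularization strength `alpha` and watches the training error rise:

```python
import numpy as np

def ridge_fit(X, y, alpha):
    """Closed-form ridge regression: w = (X^T X + alpha*I)^(-1) X^T y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + alpha * np.eye(d), X.T @ y)

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(20, 1))
y = np.sin(3 * X[:, 0]) + rng.normal(scale=0.3, size=20)

# Degree-10 polynomial features: 11 parameters for 20 noisy points,
# so the unregularized fit chases the noise.
Xp = np.hstack([X ** k for k in range(11)])

train_mse = []
for alpha in [1e-6, 1e-2, 10.0]:
    w = ridge_fit(Xp, y, alpha)
    train_mse.append(np.mean((Xp @ w - y) ** 2))

# Training error is nondecreasing as regularization strengthens:
# the model is no longer free to memorize the noise in the training set.
```

The stronger the regularization, the worse the fit to the training data; on fresh examples drawn from the same underlying function, a moderately regularized model typically predicts better than the near-interpolating one.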

Footnotes

  1. developers.google.com/machine-learning/glossary#regularization

2024 © ak